Nonnegative matrix factorization and I-divergence alternating minimization
Authors
Abstract
Similar Articles
Nonnegative Matrix Factorization and I-Divergence Alternating Minimization
In this paper we consider the Nonnegative Matrix Factorization (NMF) problem: given an (elementwise) nonnegative matrix V ∈ R^{m×n}_+, find, for an assigned k, nonnegative matrices W ∈ R^{m×k}_+ and H ∈ R^{k×n}_+ such that V = WH. Exact, nontrivial, nonnegative factorizations do not always exist, hence it is interesting to pose the approximate NMF problem. The criterion which is commonly employed is the I-divergence …
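The approximate NMF problem under the I-divergence criterion is classically attacked with multiplicative alternating updates in the style of Lee and Seung. The sketch below is a minimal NumPy illustration of that standard scheme, not the specific algorithm analyzed in the paper; the function names, initialization, and iteration count are the author's illustrative choices.

```python
import numpy as np

def idiv(V, WH):
    # I-divergence (generalized Kullback-Leibler divergence):
    # D(V || WH) = sum_ij [ V_ij log(V_ij / WH_ij) - V_ij + WH_ij ].
    return float(np.sum(V * np.log(V / WH) - V + WH))

def nmf_idiv(V, k, iters=200, seed=0):
    """Approximate NMF minimizing D(V || WH) with the classical
    multiplicative updates; a hedged sketch, not the paper's method."""
    rng = np.random.default_rng(seed)
    m, n = V.shape
    # Strictly positive random initialization keeps the updates well defined.
    W = rng.random((m, k)) + 1e-3
    H = rng.random((k, n)) + 1e-3
    for _ in range(iters):
        WH = W @ H
        # H-update: scale each entry by how much the model under/over-shoots.
        H *= (W.T @ (V / WH)) / W.sum(axis=0, keepdims=True).T
        WH = W @ H
        # W-update: the symmetric step with the roles of W and H exchanged.
        W *= ((V / WH) @ H.T) / H.sum(axis=1, keepdims=True).T
    return W, H
```

Each update is known to be non-increasing in the I-divergence, so the objective decreases monotonically from any positive initialization.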
Approximate Nonnegative Matrix Factorization via Alternating Minimization
In this paper we consider the Nonnegative Matrix Factorization (NMF) problem: given an (elementwise) nonnegative matrix V ∈ R^{m×n}_+, find, for an assigned k, nonnegative matrices W ∈ R^{m×k}_+ and H ∈ R^{k×n}_+ such that V = WH. Exact, nontrivial, nonnegative factorizations do not always exist, hence it is interesting to pose the approximate NMF problem. The criterion which is commonly employed is the I-divergence …
Nonnegative Matrix Factorization with the β-Divergence
The equivalence can be formalized as follows: for a particular c in (21), there is a corresponding δ > 0 in the optimization in (A-1). We focus on ℓ₁-ARD, where f(x) = ‖x‖₁. Then the objective is concave in H. One natural way to solve (A-1) iteratively is to use an MM procedure, upper bounding the objective function with its tangent (first-order Taylor expansion) at the current iterate H. This …
Kullback-Leibler Divergence for Nonnegative Matrix Factorization
The I-divergence, or unnormalized generalization of the Kullback-Leibler (KL) divergence, is commonly used in Nonnegative Matrix Factorization (NMF). This divergence has the drawback that its gradients with respect to the factorizing matrices depend heavily on the scales of the matrices, and learning the scales in gradient-descent optimization may require many iterations. This is often handled by expl…
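The scale issue described in this abstract stems from an inherent ambiguity: WH is unchanged if W is multiplied by a positive factor and H divided by the same factor. One common remedy (a hedged sketch; the function name and the choice of normalizing the columns of W to sum to one are illustrative assumptions, not this paper's specific fix) is to renormalize after each update:

```python
import numpy as np

def normalize_columns(W, H):
    """Rescale so each column of W sums to 1 while leaving the product
    WH unchanged: W -> W / s and H -> s * H cancel exactly."""
    s = W.sum(axis=0, keepdims=True)  # column sums, shape (1, k)
    return W / s, H * s.T             # s.T broadcasts over rows of H
```

Pinning the scales this way removes one direction of flatness from the objective, which is the kind of degeneracy that slows gradient-descent learning of the factors.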
Projective Nonnegative Matrix Factorization with α-Divergence
A new matrix factorization algorithm which combines two recently proposed nonnegative learning techniques is presented. Our new algorithm, α-PNMF, inherits the advantages of Projective Nonnegative Matrix Factorization (PNMF) for learning a highly orthogonal factor matrix. When the Kullback-Leibler (KL) divergence is generalized to the α-divergence, it gives our method more flexibility in approximati…
Journal
Journal title: Linear Algebra and its Applications
Year: 2006
ISSN: 0024-3795
DOI: 10.1016/j.laa.2005.11.012